Sensitivity auditing
Sensitivity auditing is an extension of sensitivity analysis for use in policy-relevant modelling studies. Its use is recommended 〔Saltelli, A., van der Sluijs, J., Guimarães Pereira, Â., Funtowicz, S.O., 2013, What do I make of your Latinorum? Sensitivity auditing of mathematical modelling, International Journal of Foresight and Innovation Policy, 9 (2/3/4), 213–234.〕 when a sensitivity analysis (SA) of a model-based study is meant to underpin an inference, and to certify its robustness, in a context where the inference feeds into a policy or decision-making process. In these cases the framing of the analysis itself, its institutional context, and the motivations of its author may become a matter of great importance, and a pure SA - with its emphasis on parametric uncertainty - may be seen as insufficient. The emphasis on framing may derive, inter alia, from the relevance of the policy study to different constituencies that are characterized by different norms and values, and hence by a different story about `what the problem is' and, foremost, about `who is telling the story'. Most often the framing includes more or less implicit assumptions, ranging from the political (e.g. which group needs to be protected) to the technical (e.g. which variable can be treated as a constant).
In order to take these concerns into due consideration, the instruments of sensitivity analysis have been extended to provide an assessment of the entire knowledge- and model-generating process. This approach has been called sensitivity auditing. It takes inspiration from NUSAP,〔Van der Sluijs JP, Craye M, Funtowicz S, Kloprogge P, Ravetz J, Risbey J (2005) Combining quantitative and qualitative measures of uncertainty in model based environmental assessment: the NUSAP system. Risk Analysis 25(2):481-492.〕 a method used to qualify the worth of quantitative information through the generation of `pedigrees' of numbers. Likewise, sensitivity auditing has been developed to provide pedigrees of models and model-based inferences. Sensitivity auditing has been designed especially for an adversarial context, where not only the nature of the evidence, but also the degree of certainty and uncertainty associated with the evidence, will be the subject of partisan interests.
==Approach==
Sensitivity auditing is structured along a set of seven rules/imperatives:
# Check against the rhetorical use of mathematical modeling. Question addressed: is the model being used to elucidate or to obfuscate?;
# Adopt an `assumption hunting' attitude. Question addressed: what was `assumed out'? What are the tacit, pre-analytical, possibly normative assumptions underlying the analysis?;
# Detect Garbage In Garbage Out (GIGO). Issue addressed: artificial deflation of uncertainty, operated in order to achieve a desired inference at a desired level of confidence. The rule also covers the reverse practice, the artificial inflation of uncertainties, e.g. to deter regulation;
# Find sensitive assumptions before they find you. Issue addressed: anticipate criticism by doing careful homework via sensitivity and uncertainty analyses before publishing results.
# Aim for transparency. Issue addressed: stakeholders should be able to make sense of, and possibly replicate, the results of the analysis;
# Do the right sums, which is more important than `Do the sums right'. Issue addressed: is the viewpoint of a relevant stakeholder being neglected? Who decided that there was a problem and what the problem was?
# Focus the analysis on the key question answered by the model, exploring the entire space of the assumptions holistically. Issue addressed: don't perform perfunctory analyses that just `scratch the surface' of the system's potential uncertainties.
The first rule looks at the instrumental use of mathematical modeling to advance one's agenda. This use is called rhetorical, or strategic, like the use of Latin to confuse or obfuscate an interlocutor.
The second rule about `assumption hunting' is a reminder to look for what was assumed when the model was originally framed. Models are full of caeteris paribus assumptions: in economics, for example, a model can predict the result of a shock to a given set of equations assuming that all the rest - all other inputs and variables - remains equal. In real life, however, caeteris are never paribus: variables tend to be linked with one another, so they can hardly change in isolation.
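A minimal numerical sketch of this point, using a hypothetical two-input linear model and an assumed correlation (all values are illustrative, not taken from any cited study): the predicted effect of a shock to one input differs markedly depending on whether the other input is held fixed, caeteris paribus style, or allowed to co-move with it.
<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear model with two inputs (illustrative coefficients).
def model(x1, x2):
    return 2.0 * x1 + 3.0 * x2

n = 100_000
x1 = rng.normal(0.0, 1.0, n)
# Assume x2 co-moves with x1 (correlation of about 0.8); this is an assumption
# made for illustration only.
rho = 0.8
x2 = rho * x1 + np.sqrt(1.0 - rho**2) * rng.normal(0.0, 1.0, n)

# Caeteris paribus: shock x1 by one unit while x2 is held fixed.
effect_fixed = model(x1 + 1.0, x2).mean() - model(x1, x2).mean()

# Linked variables: x2 responds to the shock through its co-movement with x1.
effect_linked = model(x1 + 1.0, x2 + rho * 1.0).mean() - model(x1, x2).mean()

print(f"effect with x2 held fixed: {effect_fixed:.2f}")   # ~2.0
print(f"effect with x2 co-moving:  {effect_linked:.2f}")  # ~4.4
</syntaxhighlight>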
Rule three is about artificially exaggerating or playing down uncertainties wherever convenient. The tobacco lobbies exaggerated the uncertainties about the health effects of smoking according to Oreskes and Conway,〔Oreskes N, Conway EM (2010) Merchants of Doubt: How a Handful of Scientists Obscured the Truth on Issues from Tobacco Smoke to Global Warming. Bloomsbury Press, New York.〕 while advocates of the death penalty played down the uncertainties in the negative relation between capital punishment and crime rates.〔Leamer EE (2010) Tantalus on the road to asymptopia. Journal of Economic Perspectives 4(2):31-46.〕 Clearly the latter wanted the policy, in this case the death penalty, and were interested in showing that the supporting evidence was robust. In the former case the lobbies did not want regulation (e.g. bans on tobacco smoking in public places) and were hence interested in amplifying the uncertainty in the causal relationship between smoking and health effects.
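A toy Monte Carlo exercise can show the mechanics of rule three; the model, means and spreads below are hypothetical and purely illustrative. The same model yields an apparently robust or an inconclusive inference depending solely on how wide the input uncertainties are assumed to be.
<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical policy model: net benefit of an intervention (all numbers illustrative).
def net_benefit(benefit, cost):
    return benefit - cost

def prob_positive(benefit_sd, cost_sd, n=100_000):
    """Share of Monte Carlo draws in which the net benefit comes out positive."""
    benefit = rng.normal(10.0, benefit_sd, n)
    cost = rng.normal(8.0, cost_sd, n)
    return float((net_benefit(benefit, cost) > 0).mean())

# Deflated input uncertainty: the inference looks nearly certain.
print(prob_positive(benefit_sd=0.5, cost_sd=0.5))   # ~1.00

# Wider, arguably more honest, uncertainty: the same model is far less conclusive.
print(prob_positive(benefit_sd=4.0, cost_sd=4.0))   # ~0.64
</syntaxhighlight>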
Rule four is about `confessing' uncertainties before going public with the analysis. This rule is also one of the commandments of applied econometrics according to Kennedy:〔Kennedy, P. (2007) A Guide to Econometrics, 5th ed., p. 396, Blackwell Publishing, Oxford.〕 `Thou shalt confess in the presence of sensitivity. Corollary: Thou shalt anticipate criticism'. According to this rule, a sensitivity analysis should be performed before the results of a modelling study are published. There are many good reasons for doing this. One is that a carefully performed sensitivity analysis often uncovers plain coding mistakes or model inadequacies. Another is that, more often than not, the analysis reveals uncertainties larger than those anticipated by the model developers.
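A minimal sketch of what `confessing' might look like in practice (hypothetical model and input ranges, chosen only for illustration): instead of publishing a single point estimate, the analyst propagates the assumed input uncertainties through the model and reports the resulting spread alongside it.
<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(3)

# Hypothetical model output as a function of two uncertain parameters.
def model(a, b):
    return a * np.exp(b)

# Point estimate at the nominal parameter values.
a0, b0 = 2.0, 0.5
print("nominal prediction:", round(model(a0, b0), 2))

# Propagate assumed input uncertainties (illustrative ranges) through the model.
a = rng.uniform(1.5, 2.5, 100_000)
b = rng.uniform(0.2, 0.8, 100_000)
y = model(a, b)

# Report an uncertainty interval together with the point estimate.
lo, hi = np.percentile(y, [5, 95])
print(f"90% interval: [{lo:.2f}, {hi:.2f}]")
</syntaxhighlight>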
Rule five is about presenting the results of the modelling study in a transparent fashion. Rules four and five both originate from the practice of impact assessment, where a modelling study presented without a proper SA, or originating from a model which is in fact a black box, may end up being rejected by stakeholders.〔Saltelli, A., Funtowicz, S., 2014, When all models are wrong: More stringent quality criteria are needed for models used at the science-policy interface, Issues in Science and Technology, Winter 2014, 79-85.〕 Together, rules four and five suggest that reproducibility may be a condition for transparency, and that the latter may be a condition for legitimacy.〔Saltelli, A., Funtowicz, S., 2015, Evidence-based policy at the end of the Cartesian Dream: The case of mathematical modelling, in "The end of the Cartesian dream", edited by Ângela Guimarães Pereira and Silvio Funtowicz, Routledge, p. 147-162.〕
Rule six, about doing the right sums, is not far from the `assumption hunting' rule; it is just more general. It deals with the fact that an analyst is often set to work on an analysis arbitrarily framed to the advantage of a party. Sometimes this comes via the choice of the discipline selected to do the analysis. Thus an environmental impact problem may be framed through the lens of economics, and presented as a cost-benefit or risk analysis, while the issue has little to do with costs, benefits or risks and a lot to do with profits, controls, and norms. An example is in Marris et al.〔Marris, C., Wynne, B., Simmons, P., and Weldon, S., 2001, Final Report of the PABE Research Project Funded by the Commission of European Communities, Contract number: FAIR CT98-3844 (DG12-SSMI), Dec, Lancaster: University of Lancaster.〕 on the issue of GMOs, which is mostly presented in the public discourse as a food safety issue, while the spectrum of concerns of GMO opponents - including lay citizens - appears broader.
Rule seven is about avoiding a perfunctory sensitivity analysis. An SA in which each uncertain input is varied one at a time, while all other inputs are left fixed, is perfunctory.〔Saltelli, A., Annoni, P., 2010, How to avoid a perfunctory sensitivity analysis, Environmental Modelling and Software, 25, 1508-1517.〕 A true SA should make an honest effort to activate all uncertainties simultaneously, leaving the model free to display its full nonlinear and possibly non-additive behaviour. A similar point is made in Sam L. Savage's book `The Flaw of Averages'.〔Savage SL (2009) The Flaw of Averages: Why We Underestimate Risk in the Face of Uncertainty, Wiley.〕
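The contrast between a one-at-a-time design and a global one can be made concrete with a stylised example (hypothetical model; packages such as SALib provide variance-based measures for real studies): when the output is driven by an interaction between inputs, moving each input separately around a nominal point suggests that nothing matters, while sampling all inputs simultaneously reveals the true spread.
<syntaxhighlight lang="python">
import numpy as np

rng = np.random.default_rng(2)

# Hypothetical non-additive model: the output is driven purely by an interaction.
def model(x1, x2):
    return x1 * x2

grid = np.linspace(-1.0, 1.0, 21)
nominal = 0.0   # baseline value of the input held fixed in the OAT design

# One-at-a-time (OAT): vary each input separately, keep the other at its nominal value.
oat_span_x1 = np.ptp(model(grid, nominal))   # 0.0: x1 looks irrelevant
oat_span_x2 = np.ptp(model(nominal, grid))   # 0.0: x2 looks irrelevant

# Global design: activate both uncertainties simultaneously.
x1 = rng.uniform(-1.0, 1.0, 100_000)
x2 = rng.uniform(-1.0, 1.0, 100_000)
global_std = model(x1, x2).std()             # ~0.33: the output is in fact uncertain

print(oat_span_x1, oat_span_x2, round(float(global_std), 2))
</syntaxhighlight>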
